Convergence of Conjugate Gradient Methods with a Closed-Form Stepsize Formula
Authors
Abstract
Conjugate gradient methods are efficient for minimizing differentiable objective functions in high-dimensional spaces. However, line search strategies that guarantee convergence are usually not easy to choose, nor to implement. Sun and colleagues (Ann. Oper. Res. 103:161–173, 2001; J. Comput. Appl. Math. 146:37–45, 2002) introduced a simple closed-form stepsize formula. However, the associated convergence domain turns out to be overly restrictive, since it precludes the optimal stepsize in the convex quadratic case. Here, we identify this stepsize formula with one iteration of the Weiszfeld algorithm in the scalar case. More generally, we propose to use a finite number of iterations of such an algorithm to compute the stepsize. In this framework, we establish a new convergence domain that includes the optimal stepsize in the convex quadratic case.
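To ground the discussion, here is a minimal Python sketch of a conjugate gradient loop driven by a closed-form, line-search-free stepsize. The Lipschitz-type rule alpha_k = delta * (-g_k^T d_k) / (L * ||d_k||^2) and the Polak-Ribière coefficient are illustrative assumptions, not the paper's exact scheme, which instead refines the stepsize with a finite number of Weiszfeld iterations.

    import numpy as np

    def cg_closed_form(grad, x0, L, delta=0.5, iters=100, tol=1e-8):
        # Conjugate gradient with a closed-form stepsize instead of a line
        # search; the Lipschitz-type stepsize rule below is an assumption.
        x = x0.copy()
        g = grad(x)
        d = -g
        for _ in range(iters):
            if np.linalg.norm(g) < tol:
                break
            alpha = delta * (-g.dot(d)) / (L * d.dot(d))  # closed-form stepsize
            x = x + alpha * d
            g_new = grad(x)
            beta = g_new.dot(g_new - g) / g.dot(g)  # Polak-Ribiere (assumed)
            d = -g_new + beta * d
            g = g_new
        return x

For a convex quadratic f(x) = 0.5 x^T A x - b^T x, one can call this with grad = lambda x: A @ x - b and L set to the largest eigenvalue of A.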
Similar resources
Global Convergence of the Dai-Yuan Conjugate Gradient Method with Perturbations
In this paper, the authors propose a class of Dai-Yuan (DY) conjugate gradient methods with line search in the presence of perturbations, for general functions and uniformly convex functions respectively. Their iterate formula is x_{k+1} = x_k + α_k (s_k + ω_k), where the main direction s_k is obtained by the DY conjugate gradient method, ω_k is a perturbation term, and the stepsize α_k is determined by line search...
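As a hedged illustration of the iterate formula quoted above, the snippet below performs one perturbed step in Python; the Dai-Yuan coefficient is the standard one, while the stepsize alpha and perturbation omega are supplied by the caller.

    import numpy as np

    def perturbed_dy_step(x, g, g_prev, d_prev, alpha, omega):
        # Dai-Yuan coefficient: beta = ||g_k||^2 / (d_{k-1}^T (g_k - g_{k-1})).
        beta = g.dot(g) / d_prev.dot(g - g_prev)
        s = -g + beta * d_prev             # main direction from the DY method
        return x + alpha * (s + omega), s  # x_{k+1} = x_k + alpha_k (s_k + omega_k)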
OPTML 2017: Variable Metric Proximal Gradient Method with Diagonal Barzilai-Borwein Stepsize
We propose a diagonal metric selection for the variable metric proximal gradient method (VMPG). The proposed metric better captures the local geometry of the problem and provides improved convergence compared to the standard proximal gradient (PG) methods with Barzilai-Borwein (BB) stepsize selection. Further, we provide convergence guarantees for the proposed method and illustrate its advantages o...
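A minimal sketch of one way such a diagonal BB metric can be built from the secant pair; the componentwise least-squares fit and the safeguarding interval below are assumptions, since the snippet above does not spell out the paper's exact rule.

    import numpy as np

    def diagonal_bb_metric(s, y, lo=1e-8, hi=1e8):
        # Fit a diagonal matrix D to the secant equation D s ~ y, where
        # s = x_k - x_{k-1} and y = g_k - g_{k-1}; componentwise this gives
        # d_i = y_i / s_i, safeguarded into [lo, hi] (assumed variant).
        denom = s * s
        d = np.divide(s * y, denom, out=np.ones_like(denom, dtype=float), where=denom > 0)
        return np.clip(d, lo, hi)

The variable metric proximal gradient step would then scale each gradient coordinate by 1/d_i before applying the proximal operator.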
A conjugate gradient based method for Decision Neural Network training
Decision Neural Network is a new approach for solving multi-objective decision-making problems based on artificial neural networks. By using imprecise evaluation data, network training is improved and the number of required training data sets is reduced. The available training method is based on the gradient descent method (BP). One of its limitations is its slow convergence speed. Therefore,...
A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems
In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that, based on eigenvalue analysis, the search direction always satisfies the sufficient descent condition independently of the line search. The globa...
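For reference, a short Python sketch of the underlying Liu-Storey direction; the paper's three-term correction is omitted, so this shows only the assumed two-term base method.

    import numpy as np

    def ls_direction(g, g_prev, d_prev):
        # Liu-Storey coefficient:
        # beta = g_k^T (g_k - g_{k-1}) / (-d_{k-1}^T g_{k-1}).
        beta = g.dot(g - g_prev) / (-d_prev.dot(g_prev))
        return -g + beta * d_prev

Sufficient descent here means g_k^T d_k <= -c ||g_k||^2 for some fixed c > 0, which the proposed methods guarantee independently of the line search.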
The cyclic Barzilai–Borwein method for unconstrained optimization
In the cyclic Barzilai–Borwein (CBB) method, the same Barzilai–Borwein (BB) stepsize is reused for m consecutive iterations. It is proved that CBB is locally linearly convergent at a local minimizer with positive definite Hessian. Numerical evidence indicates that when m ≥ n/2 ≥ 3, where n is the problem dimension, CBB is locally superlinearly convergent. In the special case m = 3 and n = 2, it i...
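A compact Python sketch of the CBB recipe described above: compute a BB stepsize, then reuse it for m consecutive gradient steps. The initial stepsize and the safeguard on s^T y are assumptions.

    import numpy as np

    def cbb(grad, x0, m=3, iters=90, eps=1e-12):
        # Cyclic Barzilai-Borwein: the BB1 stepsize s^T s / s^T y is
        # recomputed once every m iterations and reused in between.
        x = x0.copy()
        g = grad(x)
        alpha = 1.0 / max(np.linalg.norm(g), eps)  # crude initial stepsize (assumed)
        for k in range(iters):
            x_new = x - alpha * g
            g_new = grad(x_new)
            if (k + 1) % m == 0:            # end of a cycle: refresh stepsize
                s, y = x_new - x, g_new - g
                if abs(s.dot(y)) > eps:
                    alpha = s.dot(s) / s.dot(y)  # BB1 stepsize
            x, g = x_new, g_new
        return x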